Bayesian Functional Linear Regression with Sparse Step Functions
Authors
Abstract
Similar resources
Robust Estimation in Linear Regression with Multicollinearity and Sparse Models
One of the factors affecting the statistical analysis of data is the presence of outliers. Methods that are not affected by outliers are called robust methods. Robust regression methods provide robust estimates of regression model parameters in the presence of outliers. Besides outliers, the linear dependency of regressor variables, which is called multicollinearity...
Bayesian Sparse Linear Regression with Unknown Symmetric Error
We study full Bayesian procedures for sparse linear regression when errors have a symmetric but otherwise unknown distribution. The unknown error distribution is endowed with a symmetrized Dirichlet process mixture of Gaussians. For the prior on regression coefficients, a mixture of point masses at zero and continuous distributions is considered. We study the behavior of the posterior with divergin...
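The coefficient prior described above, a mixture of a point mass at zero and a continuous distribution, is the classic spike-and-slab construction. A toy Gibbs sampler for it can be sketched as follows; the simulated data, Gaussian slab, fixed inclusion probability, and known noise variance are illustrative assumptions, not the paper's actual setup (which models the error distribution nonparametrically).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: only 2 of 8 coefficients are nonzero (sparse truth).
# All settings below are hypothetical, chosen for illustration.
n, p = 200, 8
beta_true = np.array([3.0, 0, 0, -2.0, 0, 0, 0, 0])
X = rng.normal(size=(n, p))
sigma2 = 1.0  # noise variance, assumed known here for simplicity
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# Spike-and-slab prior: beta_j = 0 with prob 1 - pi_incl,
# beta_j ~ N(0, tau2) with prob pi_incl.
pi_incl, tau2 = 0.5, 10.0

def gibbs_spike_slab(X, y, sigma2, pi_incl, tau2, n_iter=2000, burn=500):
    n, p = X.shape
    beta = np.zeros(p)
    draws = []
    xtx = np.sum(X * X, axis=0)
    for it in range(n_iter):
        for j in range(p):
            # Residual with coefficient j removed.
            r = y - X @ beta + X[:, j] * beta[j]
            a = xtx[j] / sigma2 + 1.0 / tau2   # conditional posterior precision
            b = X[:, j] @ r / sigma2
            # Log Bayes factor for beta_j != 0 vs beta_j == 0
            # (Gaussian slab integrated out analytically).
            log_bf = 0.5 * np.log(1.0 / (tau2 * a)) + b**2 / (2.0 * a)
            p_in = 1.0 / (1.0 + (1 - pi_incl) / pi_incl * np.exp(-log_bf))
            if rng.random() < p_in:
                beta[j] = rng.normal(b / a, np.sqrt(1.0 / a))
            else:
                beta[j] = 0.0
        if it >= burn:
            draws.append(beta.copy())
    return np.array(draws)

draws = gibbs_spike_slab(X, y, sigma2, pi_incl, tau2)
post_mean = draws.mean(axis=0)          # posterior mean of coefficients
incl_prob = (draws != 0).mean(axis=0)   # posterior inclusion probabilities
```

The posterior inclusion probabilities concentrate near 1 for the two truly nonzero coefficients and stay small for the rest, which is the sparsity-recovery behavior these papers study.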
An Empirical Analysis of Algorithms for Bayesian Sparse Linear Regression
Two conceptually similar algorithms for performing sparse linear regression under a Bayesian framework are presented. A novel implementation of a third algorithm is adapted from the log-linear regression task and applied to the linear regression model. The algorithms differ in terms of the search strategy used to explore the space of possible active variables; one uses a greedy approach, while ...
Conditional Sparse Linear Regression
Machine learning and statistics typically focus on building models that capture the vast majority of the data, possibly ignoring a small subset of data as “noise” or “outliers.” By contrast, here we consider the problem of jointly identifying a significant (but perhaps small) segment of a population in which there is a highly sparse linear regression fit, together with the coefficients for the ...
Online Sparse Linear Regression
We consider the online sparse linear regression problem: the problem of sequentially making predictions while observing only a limited number of features in each round, so as to minimize regret with respect to the best sparse linear regressor, where prediction accuracy is measured by square loss. We give an inefficient algorithm that obtains regret bounded by Õ(√T) after T prediction rounds. We...
Journal
Journal title: Bayesian Analysis
Year: 2019
ISSN: 1936-0975
DOI: 10.1214/18-ba1095